17 research outputs found

    Energy-Efficient Load Balancing Algorithm for Workflow Scheduling in Cloud Data Centers Using Queuing and Thresholds

    Cloud computing is a rapidly growing technology that has been adopted in recent years in various fields, such as business, research, industry, and computing. Cloud computing provides different services over the internet, thus eliminating the need for dedicated hardware and other resources. Cloud computing environments face challenges in terms of resource utilization, energy efficiency, heterogeneous resources, etc. Task scheduling and virtual machine (VM) consolidation are used to tackle these issues. Task scheduling has been extensively studied in the literature, with different parameters and objectives. In this article, we address the problem of energy consumption and efficient resource utilization in virtualized cloud data centers. The proposed algorithm is based on task classification and thresholds for efficient scheduling and better resource utilization. In the first phase, workflow tasks are pre-processed to avoid bottlenecks by placing tasks with many dependencies and long execution times in separate queues. In the next step, tasks are classified based on the intensities of the required resources. Finally, Particle Swarm Optimization (PSO) is used to select the best schedules. Experiments were performed to validate the proposed technique, and comparative results obtained on benchmark datasets are presented. The results show the effectiveness of the proposed algorithm over the algorithms to which it was compared in terms of energy consumption, makespan, and load balancing.
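
    The final step of the pipeline, using PSO to pick a task-to-VM assignment, can be sketched as follows. This is a minimal illustration rather than the authors' implementation: the queue pre-processing, task classification, thresholds, and energy model are omitted, and the task lengths and VM speeds are hypothetical values.

```python
# Minimal, illustrative PSO for mapping tasks to VMs by makespan.
# Not the paper's algorithm: queue pre-processing, task classification,
# thresholds, and the energy model are omitted.
import random

TASK_LEN = [40, 12, 75, 33, 51, 8, 64, 27]   # hypothetical task lengths (MI)
VM_SPEED = [10, 20, 15]                      # hypothetical VM speeds (MIPS)
N_TASKS, N_VMS = len(TASK_LEN), len(VM_SPEED)

def makespan(assign):
    """Finish time of the busiest VM for a task->VM assignment."""
    load = [0.0] * N_VMS
    for t, vm in enumerate(assign):
        load[vm] += TASK_LEN[t] / VM_SPEED[vm]
    return max(load)

def decode(pos):
    """Map continuous particle positions to discrete VM indices."""
    return [min(N_VMS - 1, max(0, int(round(x)))) for x in pos]

def pso(n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(0, N_VMS - 1) for _ in range(N_TASKS)] for _ in range(n_particles)]
    vel = [[0.0] * N_TASKS for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_fit = [makespan(decode(p)) for p in pos]
    g = pbest_fit.index(min(pbest_fit))
    gbest, gbest_fit = pbest[g][:], pbest_fit[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(N_TASKS):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fit = makespan(decode(pos[i]))
            if fit < pbest_fit[i]:
                pbest[i], pbest_fit[i] = pos[i][:], fit
                if fit < gbest_fit:
                    gbest, gbest_fit = pos[i][:], fit
    return decode(gbest), gbest_fit

if __name__ == "__main__":
    schedule, ms = pso()
    print("best schedule:", schedule, "makespan:", round(ms, 2))
```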

    Applications and Algorithms for Inference of Huge Phylogenetic Trees: a Review

    Phylogenetics enables us to use various techniques to extract evolutionary relationships from sequence analysis. Most phylogenetic analysis techniques produce phylogenetic trees that represent the relationships among a set of species or their evolutionary history. This article presents a comprehensive survey of the applications and algorithms for the inference of huge phylogenetic trees and gives the reader an overview of the methods currently employed for phylogenetic tree inference. A comprehensive comparison of the methods and algorithms is presented in this paper.

    Critical Review of Blockchain Consensus Algorithms: challenges and opportunities

    Blockchain is a distributed ledger in which transactions are grouped in blocks linked by hash pointers. Blockchain-based solutions provide trust and privacy because of their resistance to data inconsistency and their advanced cryptographic features. In various fields, blockchain technology has been implemented to ensure transparency, verifiability, interoperability, governance, and management of information systems. Processing the large volumes of data generated through emerging technologies is a major issue, and many researchers have used blockchain integrated with IoT in various fields, e.g., Industry 4.0, biomedical, health, and genomics. Blockchain has the attributes of decentralization, robustness, security, and immutability, with the possibility to secure the system design for the transmission and storage of data. The purpose of consensus protocols is to maintain the security and effectiveness of the blockchain network, and utilizing the correct protocol enhances the performance of blockchain applications. This article presents the essential principles and attributes of consensus algorithms to show the applications, challenges, and opportunities of blockchain technology. Moreover, future research directions are also presented to help choose an appropriate consensus algorithm to enhance the performance of blockchain-based applications.
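
    To make the underlying data structure concrete, the sketch below builds a chain of blocks linked by hash pointers and adds a toy proof-of-work nonce search as one example of a consensus building block. It is a generic illustration only and does not correspond to any specific protocol discussed in the review.

```python
# Minimal sketch of a hash-linked block chain with a toy proof-of-work.
# Real consensus protocols (PoW, PoS, PBFT, ...) involve networking,
# incentives, and fork-choice rules that are not shown here.
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 over the block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def mine_block(prev_hash, transactions, difficulty=3):
    """Find a nonce so the block hash starts with `difficulty` zeros."""
    nonce = 0
    while True:
        block = {"prev_hash": prev_hash, "transactions": transactions, "nonce": nonce}
        h = block_hash(block)
        if h.startswith("0" * difficulty):
            return block, h
        nonce += 1

def chain_is_valid(chain):
    """Check that each block's prev_hash matches the previous block's hash."""
    for prev, curr in zip(chain, chain[1:]):
        if curr["prev_hash"] != block_hash(prev):
            return False
    return True

genesis, gh = mine_block("0" * 64, ["genesis"])
block1, _ = mine_block(gh, ["alice->bob:5"])
print(chain_is_valid([genesis, block1]))   # True
```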

    A Resource Utilization Prediction Model for Cloud Data Centers Using Evolutionary Algorithms and Machine Learning Techniques

    Cloud computing has revolutionized the modes of computing. With huge success and diverse benefits, the paradigm faces several challenges as well. Power consumption, dynamic resource scaling, and over- and under-provisioning are challenges for the cloud computing paradigm. Research has been carried out on resource utilization prediction in cloud computing to overcome over- and under-provisioning issues. Over-provisioning of resources consumes more energy and leads to high costs, whereas under-provisioning induces Service Level Agreement (SLA) violations and Quality of Service (QoS) degradation. Most existing mechanisms focus on predicting the utilization of a single resource, such as memory, CPU, storage, network, or servers allocated to cloud applications, but overlook the correlation among resources. This research focuses on multi-resource utilization prediction using a Functional Link Neural Network (FLNN) with a hybrid Genetic Algorithm (GA) and Particle Swarm Optimization (PSO). The proposed technique is evaluated on Google cluster trace data. Experimental results show that the proposed model yields better accuracy compared to traditional techniques.
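
    The FLNN at the heart of this approach expands each input feature with a fixed functional basis and learns only a single linear output layer. The sketch below shows that structure on hypothetical CPU and memory utilization data; for brevity the weights are fit by least squares rather than by the hybrid GA/PSO search used in the paper.

```python
# Minimal sketch of a Functional Link Neural Network (FLNN): inputs are
# expanded with a trigonometric basis and fed to a single linear layer.
# The hybrid GA/PSO weight search from the paper is not reproduced here.
import numpy as np

def functional_expansion(x):
    """Expand each feature x_i into [x_i, sin(k*pi*x_i), cos(k*pi*x_i)] for k = 1, 2."""
    parts = [x]
    for k in (1, 2):
        parts.append(np.sin(k * np.pi * x))
        parts.append(np.cos(k * np.pi * x))
    return np.hstack(parts)

def fit_flnn(X, y):
    """Closed-form least-squares fit of the linear output layer (with bias)."""
    Phi = np.hstack([functional_expansion(X), np.ones((X.shape[0], 1))])
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def predict_flnn(w, X):
    Phi = np.hstack([functional_expansion(X), np.ones((X.shape[0], 1))])
    return Phi @ w

# Hypothetical example: predict next-interval CPU utilization from the
# current CPU and memory utilization of a VM (values in [0, 1]).
rng = np.random.default_rng(0)
X = rng.random((200, 2))                       # columns: cpu, mem
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.05 * rng.standard_normal(200)
w = fit_flnn(X[:150], y[:150])
pred = predict_flnn(w, X[150:])
print("MAE:", float(np.mean(np.abs(pred - y[150:]))))
```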

    A Lightweight Authentication and Authorization Framework for Blockchain-Enabled IoT Network in Health-Informatics

    Blockchain and IoT are being deployed at a large scale in various fields, including healthcare, for applications such as secure storage, transactions, and process automation. IoT devices are resource-constrained, lack the capability for security and self-protection, and can easily be hacked or compromised. Furthermore, blockchain is an emerging technology with immutability features that provide secure management, authentication, and guaranteed access control for IoT devices. IoT is a cloud-based internet service in which the processing and collection of users' data are accomplished remotely. Smart healthcare also requires the ability to diagnose patients located remotely. The smart health framework faces critical issues such as data security, cost, memory, scalability, trust, and transparency between different platforms. It is therefore important to handle data integrity and privacy, as the user's authenticity is in question in an open internet environment. Several techniques are available that primarily focus on resolving security issues, e.g., forgery, timing, denial-of-service, and stolen-smartcard attacks. Blockchain technology follows the rules of absolute privacy to identify the users associated with transactions. The motivation behind the use of blockchain in health informatics is the removal of the centralized third party, immutability, improved data sharing, enhanced security, and reduced overhead costs in distributed applications. Healthcare informatics has specific security and privacy requirements, along with additional legal requirements. This paper presents a novel authentication and authorization framework for blockchain-enabled IoT networks using a probabilistic model. The proposed framework makes use of random numbers in the authentication process, which are further connected through joint conditional probability. Hence, it establishes a secure connection among IoT devices for further data acquisition. The proposed model is validated and evaluated through extensive simulations using the AVISPA tool and the Cooja simulator, respectively. Analysis of the experimental results shows that the proposed framework provides robust mutual authenticity, enhanced access control, and lower communication and computational overhead compared to existing schemes.
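
    As a rough intuition for how random numbers can establish mutual trust between a resource-constrained device and a gateway, the sketch below shows a generic nonce-based challenge-response exchange over a pre-shared key. It is a generic illustration only; the paper's framework couples the random numbers through a joint conditional probability model, which is not reproduced here.

```python
# Generic nonce-based mutual authentication sketch with a pre-shared key.
# Illustrative only; not the paper's probabilistic scheme.
import hmac
import hashlib
import os

PSK = os.urandom(32)   # pre-shared key, provisioned out of band

def mac(key, *parts):
    """HMAC-SHA256 over the concatenated message parts."""
    return hmac.new(key, b"|".join(parts), hashlib.sha256).digest()

# 1. Device sends a challenge nonce to the gateway.
device_nonce = os.urandom(16)

# 2. Gateway answers with its own nonce and a proof over both nonces.
gateway_nonce = os.urandom(16)
gateway_proof = mac(PSK, b"gateway", device_nonce, gateway_nonce)

# 3. Device verifies the gateway, then returns its own proof.
assert hmac.compare_digest(gateway_proof, mac(PSK, b"gateway", device_nonce, gateway_nonce))
device_proof = mac(PSK, b"device", gateway_nonce, device_nonce)

# 4. Gateway verifies the device; both sides are now mutually authenticated
#    and can derive a session key bound to this exchange.
assert hmac.compare_digest(device_proof, mac(PSK, b"device", gateway_nonce, device_nonce))
session_key = mac(PSK, b"session", device_nonce, gateway_nonce)
print("mutual authentication succeeded, session key:", session_key.hex()[:16], "...")
```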

    Patient Mortality Prediction and Analysis of Health Cloud Data Using a Deep Neural Network

    Cloud computing plays a vital role in healthcare, as it can store the large amounts of data known as big data. In the current era of computing technology, big data analysis and prediction is a challenging task in the healthcare industry. Healthcare data are crucial for the patient as well as for the respective healthcare service provider. Several healthcare organizations have adopted cloud computing for data storage and analysis. Incredible progress has been achieved in making combined health records available to data scientists and clinicians for healthcare research. However, big data in health cloud informatics demand more robust and scalable solutions for accurate analysis. The increasing number of patients is putting high pressure on healthcare services worldwide. At this stage, fast, accurate, and early clinical assessment of disease severity is vital. Predicting mortality among patients with a variety of symptoms and complications is difficult, resulting in inaccurate and slow prediction of the disease. This article presents a deep-learning-based model for the prediction of patient mortality using the Medical Information Mart for Intensive Care III (MIMIC-III) dataset. Different parameters are used to analyze the proposed model, i.e., accuracy, F1 score, recall, precision, and execution time. The results obtained are compared with state-of-the-art models to test and validate the proposed model. Moreover, this research suggests a simple and operable decision rule to quickly identify patients at the highest risk, allowing them to be prioritized and potentially reducing the mortality rate.
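
    The evaluation metrics named above can be computed as in the following sketch, which trains a small feed-forward network on synthetic data standing in for MIMIC-III (the dataset requires credentialed access and cannot be redistributed). The architecture and hyperparameters are placeholders, not the configuration reported in the article.

```python
# Minimal sketch of a feed-forward mortality classifier and the evaluation
# metrics named in the abstract. Synthetic data stands in for MIMIC-III.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Stand-in for per-admission features (vitals, labs, demographics) and a
# binary in-hospital mortality label with class imbalance.
X, y = make_classification(n_samples=2000, n_features=30, weights=[0.9, 0.1], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

model = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
model.fit(X_train, y_train)
pred = model.predict(X_test)

print("accuracy :", accuracy_score(y_test, pred))
print("precision:", precision_score(y_test, pred))
print("recall   :", recall_score(y_test, pred))
print("F1 score :", f1_score(y_test, pred))
```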

    Knowledge-Based Framework for Selection of Genomic Data Compression Algorithms

    The development of new sequencing technologies has led to a significant increase in biological data. The exponential increase in data has exceeded the increase in computing power, and the storage and analysis of this huge amount of data pose challenges for researchers. Data compression is used to reduce the size of data, which ultimately reduces the cost of data transmission over the Internet. The field comprises experts from two domains, i.e., computer scientists and biological scientists. Computer scientists develop programs to solve biological problems, whereas biologists use these programs. Computer programs need parameters that are usually provided as input by the users, so users need to know the different parameters while operating these programs. Configuring parameters manually is time-consuming and increases the chance of errors, and the program selected by the user may not be an efficient solution for the desired parameters. This paper focuses on automatic program selection for biological data compression. Forward chaining is employed to develop an expert system to solve this problem. The system takes different parameters related to compression programs from the user and selects compression programs according to the desired parameters. The proposed solution is evaluated by testing it with benchmark datasets using programs available in the literature.
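
    Forward chaining of this kind can be sketched as a small rule engine that keeps firing rules until no new facts are derived. The facts, rules, and tool names below are illustrative placeholders, not the knowledge base defined in the paper.

```python
# Minimal forward-chaining sketch for rule-based selection of a compression
# program. Facts, rules, and tool names are illustrative only.
def forward_chain(facts, rules):
    """Repeatedly fire rules whose conditions hold until no new facts appear."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

RULES = [
    ({"format:FASTQ", "priority:ratio"},      "recommend:SPRING"),
    ({"format:FASTQ", "priority:speed"},      "recommend:DSRC2"),
    ({"format:FASTA", "reference_available"}, "recommend:reference-based"),
    ({"format:FASTA", "priority:ratio"},      "recommend:MFCompress"),
]

# User-supplied parameters become the initial facts.
user_facts = {"format:FASTQ", "priority:speed"}
derived = forward_chain(user_facts, RULES)
print([f for f in derived if f.startswith("recommend:")])   # ['recommend:DSRC2']
```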

    A Robust Framework for Real-Time Iris Landmarks Detection Using Deep Learning

    Iris detection and tracking play a vital role in human–computer interaction and have become an emerging field for researchers in the last two decades. Typical applications such as virtual reality, augmented reality, gaze detection for customer behavior, controlling computers, and handheld embedded devices need accurate and precise detection of iris landmarks. Significant improvements have been made so far in iris detection and tracking. However, detecting iris landmarks in real time with high accuracy is still a challenging and computationally expensive task, and it is accompanied by the lack of a publicly available dataset of annotated iris landmarks. This article presents a benchmark dataset and a robust framework for the localization of key landmark points to extract the iris with better accuracy. A number of training sessions were conducted for MobileNetV2, ResNet50, VGG16, and VGG19 over an iris landmarks dataset, and ImageNet weights were used for model initialization. The Mean Absolute Error (MAE), model loss, and model size are measured to evaluate and validate the proposed model. Analysis of the results shows that the proposed model outperforms the other methods on the selected parameters. The MAEs of MobileNetV2, ResNet50, VGG16, and VGG19 are 0.60, 0.33, 0.35, and 0.34; the average decrease in size is 60%, and the average reduction in response time is 75% compared to the other models. We collected images of eyes and annotated them with the help of the proposed algorithm, and the generated dataset has been made publicly available for research purposes. The contribution of this research is a model of smaller size with real-time, accurate prediction of iris landmarks, along with the provided dataset of iris landmark annotations.
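
    A landmark-regression setup of this kind can be sketched as a MobileNetV2 backbone with a small dense head that outputs one (x, y) pair per landmark and is trained with a mean-absolute-error loss. The number of landmarks, input size, and training settings below are assumptions for illustration, not the configuration reported in the article.

```python
# Minimal sketch of a MobileNetV2 backbone with a regression head for iris
# landmark coordinates, trained with an MAE loss. Landmark count, input
# size, and hyperparameters are illustrative assumptions.
import tensorflow as tf

NUM_LANDMARKS = 5          # hypothetical: e.g. iris centre plus 4 boundary points
INPUT_SHAPE = (224, 224, 3)

backbone = tf.keras.applications.MobileNetV2(
    include_top=False, weights="imagenet", input_shape=INPUT_SHAPE)

model = tf.keras.Sequential([
    backbone,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(128, activation="relu"),
    # One (x, y) pair per landmark, normalized to [0, 1] image coordinates.
    tf.keras.layers.Dense(2 * NUM_LANDMARKS, activation="sigmoid"),
])

model.compile(optimizer="adam", loss="mae", metrics=["mae"])
model.summary()

# Training would then be, for example:
# model.fit(train_images, train_landmarks,
#           validation_data=(val_images, val_landmarks), epochs=50)
```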